Context Encoding LSTM CS224N Course Project

Authors

  • Abhinav Rastogi
  • Samuel R. Bowman
Abstract

This project uses ideas from greedy transition-based parsing to build neural network models that jointly learn to parse sentences and to use those parses to guide semantic composition. The resulting model produces sentence encodings for tasks such as sentiment classification and entailment, and its performance is evaluated on the Stanford Sentiment Treebank (SST) and the Stanford Natural Language Inference (SNLI) corpus.
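
As a rough illustration of the core idea (a minimal sketch, not the authors' exact architecture; the class and variable names here are hypothetical), a shift-reduce encoder consumes a sentence together with a transition sequence: SHIFT pushes the next word embedding onto a stack, and REDUCE composes the top two stack entries into a phrase vector, so the final stack element encodes the whole sentence.

    import torch
    import torch.nn as nn

    SHIFT, REDUCE = 0, 1

    class ShiftReduceEncoder(nn.Module):
        def __init__(self, vocab_size, dim):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, dim)
            # Composition function applied on REDUCE: combines the top
            # two stack elements into a single phrase representation.
            self.compose = nn.Sequential(nn.Linear(2 * dim, dim), nn.Tanh())

        def forward(self, token_ids, transitions):
            # token_ids: list of ints; transitions: list of SHIFT/REDUCE ops
            buffer = list(self.embed(torch.tensor(token_ids)))
            stack = []
            for op in transitions:
                if op == SHIFT:        # move the next word onto the stack
                    stack.append(buffer.pop(0))
                else:                  # REDUCE: compose the top two phrases
                    right, left = stack.pop(), stack.pop()
                    stack.append(self.compose(torch.cat([left, right], dim=-1)))
            return stack[-1]           # single vector encoding the sentence

    # Usage: a three-word sentence with a right-branching parse.
    enc = ShiftReduceEncoder(vocab_size=100, dim=8)
    sentence = enc([3, 14, 15], [SHIFT, SHIFT, SHIFT, REDUCE, REDUCE])
    print(sentence.shape)  # torch.Size([8])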


Similar Resources

Awkwardly: A Response Suggester CS224N Final Project

Replying to emails can be a daunting task, especially for users who are bombarded with hundreds of emails per day or who struggle to construct well-formed, socially acceptable replies. We present Awkwardly, a novel response suggester that generates top replies shown to the user in real time. These short responses can be selected as a reply in a chat or email context. In this paper, we introduc...


CS224n PA4: Extending Match-LSTM

We propose two novel extensions to the Match-LSTM Boundary model for question answering on the SQuAD dataset. First, we propose performing attention in the passage and question encoders. Second, we propose adding a one-way conditional dependency between start-of-span and end-of-span prediction. In our evaluations, we show that these extensions result in a model that outperforms our implementation of v...
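
The second extension can be illustrated with a small sketch (assuming a PyTorch setting; this is not the paper's implementation, and the layer names are hypothetical): the end-of-span scorer is additionally given the hidden state at the greedily chosen start position, creating a one-way start-to-end dependency.

    import torch
    import torch.nn as nn

    class ConditionalSpanHead(nn.Module):
        def __init__(self, dim):
            super().__init__()
            self.start_scorer = nn.Linear(dim, 1)
            # The end scorer also sees the hidden state at the chosen
            # start, giving a one-way start -> end conditional dependency.
            self.end_scorer = nn.Linear(2 * dim, 1)

        def forward(self, passage):               # passage: (seq_len, dim)
            start_logits = self.start_scorer(passage).squeeze(-1)
            start = start_logits.argmax()         # greedy start position
            start_state = passage[start].expand_as(passage)
            end_logits = self.end_scorer(
                torch.cat([passage, start_state], dim=-1)).squeeze(-1)
            return start_logits, end_logits

    head = ConditionalSpanHead(dim=16)
    s_logits, e_logits = head(torch.randn(20, 16))
    print(s_logits.shape, e_logits.shape)  # torch.Size([20]) torch.Size([20])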


CS224n Assignment 4: Machine Comprehension with Exploration on Attention Mechanism

The goal of this paper is to perform the prediction task on the SQuAD reading comprehension dataset: given a context paragraph and a question, the model outputs an answer. To do this, we build a model combining a bidirectional LSTM with an attention-flow mechanism. The basic architecture and setup details of the model are introduced, along with a summary of performance and error analys...
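
One common ingredient of attention flow is context-to-question attention, sketched below under the assumption of simple dot-product similarity (an illustration of the general mechanism, not this paper's exact model; the function name is made up).

    import torch
    import torch.nn.functional as F

    def context_to_question_attention(context, question):
        # context: (c_len, dim), question: (q_len, dim)
        scores = context @ question.T          # (c_len, q_len) similarities
        weights = F.softmax(scores, dim=-1)    # attend over question words
        attended = weights @ question          # (c_len, dim)
        # Concatenate the original and question-aware views, as
        # attention-flow models typically do before further encoding.
        return torch.cat([context, attended], dim=-1)

    out = context_to_question_attention(torch.randn(30, 16), torch.randn(8, 16))
    print(out.shape)  # torch.Size([30, 32])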


Future Context Attention for Unidirectional LSTM Based Acoustic Model

Recently, feedforward sequential memory networks (FSMNs) have shown a strong ability to model past and future long-term dependencies in speech signals without using recurrent feedback, and have achieved better performance than BLSTMs in acoustic modeling. However, the encoding coefficients in an FSMN are context-independent, while context-dependent weights are commonly supposed to be more reasonable in acou...
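
The general idea of context-dependent weighting over future frames can be sketched as follows (a hedged illustration only, not the paper's formulation; the query projection and window size are assumptions): each unidirectional LSTM output attends over a fixed window of upcoming frames, with weights that depend on the current frame rather than being fixed coefficients.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class FutureContextAttention(nn.Module):
        def __init__(self, dim, window):
            super().__init__()
            self.window = window
            self.query = nn.Linear(dim, dim)   # weights depend on current frame

        def forward(self, frames):             # frames: (T, dim), LSTM outputs
            T = frames.size(0)
            outputs = []
            for t in range(T):
                future = frames[t + 1 : t + 1 + self.window]  # look-ahead
                if future.size(0) == 0:
                    outputs.append(frames[t])
                    continue
                q = self.query(frames[t])                 # (dim,)
                w = F.softmax(future @ q, dim=0)          # context-dependent weights
                outputs.append(frames[t] + w @ future)    # mix in future context
            return torch.stack(outputs)

    attn = FutureContextAttention(dim=16, window=3)
    print(attn(torch.randn(10, 16)).shape)  # torch.Size([10, 16])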


Attention-based Recurrent Neural Networks for Question Answering

Machine Comprehension (MC) of text is an important problem in Natural Language Processing (NLP) research, and the task of Question Answering (QA) is a major way of assessing MC outcomes. One QA dataset that has gained immense popularity recently is the Stanford Question Answering Dataset (SQuAD). Successful models for SQuAD have all involved the use of Recurrent Neural Networks (RNNs), and most o...




Publication date: 2015